Analysis of boosting algorithms using the smooth margin function

Authors
Abstract


Related articles

A Refined Margin Analysis for Boosting Algorithms via Equilibrium Margin

Much attention has been paid to the theoretical explanation of the empirical success of AdaBoost. The most influential work is the margin theory, which is essentially an upper bound for the generalization error of any voting classifier in terms of the margin distribution over the training data. However, important questions were raised about the margin explanation. Breiman (1999) proved a bound ...
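For orientation, the type of bound the margin theory provides has roughly the following shape (a schematic restatement of the Schapire–Freund–Bartlett–Lee style result, with constants and logarithmic factors suppressed; the symbols m, d, θ, δ below are illustrative notation, not taken from this abstract):

\[
\Pr_{D}\bigl[\,y f(x) \le 0\,\bigr] \;\le\; \Pr_{S}\bigl[\,y f(x) \le \theta\,\bigr] \;+\; \tilde{O}\!\left(\sqrt{\frac{d}{m\,\theta^{2}}}\,\right) \qquad \text{for every margin threshold } \theta > 0,
\]

where f is the normalized voting classifier, S the training sample of size m drawn from distribution D, and d the VC dimension of the base-classifier class; the bound holds with probability at least 1 − δ over the sample.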


On the Margin Explanation of Boosting Algorithms

Much attention has been paid to the theoretical explanation of the empirical success of AdaBoost. The most influential work is the margin theory, which is essentially an upper bound for the generalization error of any voting classifier in terms of the margin distribution over the training data. However, Breiman raised important questions about the margin explanation by developing a boosting alg...


Boosting Algorithms for Maximizing the Soft Margin

Algorithm 1: SoftBoost
1. Input: S = ⟨(x_1, y_1), ..., (x_N, y_N)⟩, desired accuracy δ, and capping parameter ν ∈ [1, N].
2. Initialize: d_n to the uniform distribution.
3. Do for t = 1, ...
   (a) Train classifier on d^{t−1} and {u^1, ..., u^{t−1}} and obtain hypothesis h_t. Set u^t_n = h_t(x_n) y_n.
   (b) Calculate the edge γ_t of h_t: γ_t = d^{t−1} · u^t.
   (c) Set γ̂_t = (min_{m=1,...,t} γ_m) − δ.
   (d) Set γ* = solution to the pr...
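The entry above reproduces the algorithm box rather than an abstract, and it is cut off before the distribution-update step. As a rough illustration of the quantities named in steps (a)–(c) only (not of the SoftBoost update itself, which solves the optimization problem that the truncated step (d) begins to state), here is a minimal sketch; the function names and toy data are hypothetical, and reading ν as imposing the cap d_n ≤ 1/ν is an assumption suggested by the range ν ∈ [1, N]:

```python
import numpy as np

def capped_uniform(N, nu):
    """Uniform start distribution; under the assumed cap d_n <= 1/nu it is
    always feasible, since 1/N <= 1/nu for nu in [1, N]."""
    d = np.full(N, 1.0 / N)
    assert np.all(d <= 1.0 / nu + 1e-12)
    return d

def edge(d, h_outputs, y):
    """Edge of a hypothesis w.r.t. distribution d: gamma = d . u,
    where u_n = h(x_n) * y_n, as in step (b) of the listing above."""
    u = h_outputs * y
    return float(np.dot(d, u)), u

# Toy usage: 5 examples, labels y, and the outputs of one base hypothesis h.
y = np.array([+1, -1, +1, +1, -1])
h_out = np.array([+1, -1, -1, +1, -1])   # h is wrong on the third example
d = capped_uniform(N=5, nu=2.0)
gamma, u = edge(d, h_out, y)
print(gamma)                              # (1 + 1 - 1 + 1 + 1) / 5 = 0.6
```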


Smooth Boosting for Margin-Based Ranking

We propose a new boosting algorithm for bipartite ranking problems. Our boosting algorithm, called SoftRankBoost, is a modification of RankBoost which maintains only smooth distributions over data. SoftRankBoost provably achieves approximately the maximum soft margin over all pairs of positive and negative examples, which implies high AUC score for future data.
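The link between large pairwise margins and a high AUC score comes from the fact that the empirical AUC of a scoring function is exactly the fraction of positive–negative pairs it orders correctly. A small sketch of that relationship (the function names and scores below are illustrative, not taken from the paper):

```python
import numpy as np

def pairwise_margins(scores_pos, scores_neg):
    """Margin of each (positive, negative) pair under a scorer f:
    m_ij = f(x_i^+) - f(x_j^-); the pair is ordered correctly when m_ij > 0."""
    return scores_pos[:, None] - scores_neg[None, :]

def empirical_auc(margins):
    """Fraction of positive-negative pairs ordered correctly (ties count 1/2)."""
    return float(np.mean((margins > 0) + 0.5 * (margins == 0)))

# Hypothetical scores produced by some ranker.
pos = np.array([2.1, 0.3, 1.5])
neg = np.array([0.2, 1.0])
print(empirical_auc(pairwise_margins(pos, neg)))   # 5 of 6 pairs correct, ~0.833
```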


Boosting Based on a Smooth Margin

We study two boosting algorithms, Coordinate Ascent Boosting and Approximate Coordinate Ascent Boosting, which are explicitly designed to produce maximum margins. To derive these algorithms, we introduce a smooth approximation of the margin that one can maximize in order to produce a maximum margin classifier. Our first algorithm is simply coordinate ascent on this function, involving a line se...

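The smooth approximation of the margin mentioned here is, in its usual formulation, a normalized log-sum-exp surrogate for the minimum margin. A commonly stated form (the notation is mine: m training examples, margins matrix M_{ij} = y_i h_j(x_i), and nonnegative combination weights λ; the cited paper may differ in details) is

\[
G(\lambda) \;=\; \frac{-\ln\!\Bigl(\sum_{i=1}^{m} e^{-(M\lambda)_i}\Bigr)}{\sum_{j}\lambda_j},
\qquad
\min_i \frac{(M\lambda)_i}{\sum_j \lambda_j} \;-\; \frac{\ln m}{\sum_j \lambda_j}
\;\le\; G(\lambda) \;\le\;
\min_i \frac{(M\lambda)_i}{\sum_j \lambda_j},
\]

so G(λ) stays within ln(m) / Σ_j λ_j of the normalized minimum margin and approaches it as the total weight grows, which is why coordinate ascent on such a function can drive the margin toward its maximum.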


Journal

Journal title: The Annals of Statistics

Year: 2007

ISSN: 0090-5364

DOI: 10.1214/009053607000000785